λ-Calculus: The Other Turing Machine

Authors

  • Guy Blelloch
  • Robert Harper
Abstract

The early 1930s were bad years for the worldwide economy, but great years for what would eventually be called Computer Science. In 1932, Alonzo Church at Princeton described his λ-calculus as a formal system for mathematical logic, and in 1935 argued that any function on the natural numbers that can be effectively computed can be computed with his calculus [4]. Independently in 1935, as a master’s student at Cambridge, Alan Turing was developing his machine model of computation. In 1936 he too argued that his model could compute all computable functions on the natural numbers, and showed that his machine and the λ-calculus are equivalent [6]. The fact that two such different models of computation calculate the same functions was solid evidence that they both represented an inherent class of computable functions. From this arose the so-called Church-Turing thesis, which states, roughly, that any function on the natural numbers can be effectively computed if and only if it can be computed with the λ-calculus, or equivalently, the Turing machine. Although the Church-Turing thesis by itself is one of the most important ideas in Computer Science, the influence of Church and Turing’s models goes far beyond the thesis itself. The Turing machine has become the core of all complexity theory [5]. In the early 1960s, Juris Hartmanis and Richard Stearns initiated the study of the complexity of computation. In 1967, Manuel Blum developed his axiomatic theory of complexity, which, although independent of any particular machine, was considered in the context of a spectrum of Turing machines (e.g., one tape, two tape, or two stack). This was followed in 1971 by Stephen Cook and Leonid Levin, who exhibited a complete problem for NP (satisfiability) and introduced the question of whether P = NP. Since then a huge number of complexity classes have been isolated and related, including PCP, the class of Probabilistically Checkable Proofs.
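As a concrete taste of how functions on the natural numbers can be computed inside the λ-calculus, here is a minimal sketch of Church numerals, written with Python lambdas standing in for λ-abstractions. The names `zero`, `succ`, `add`, `mul`, and `to_int` are illustrative choices, not notation from the article:

```python
# Church numerals: the natural number n is encoded as the higher-order
# function that applies its argument f exactly n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# Arithmetic falls out as function composition.
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
mul = lambda m: lambda n: lambda f: m(n(f))

# Decode back to an ordinary int by applying the numeral to (+1) and 0.
to_int = lambda n: n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
print(to_int(mul(two)(three)))  # 6
```

Everything here is a pure function of one argument, which is exactly the discipline Church's calculus imposes; numbers, addition, and multiplication need no primitive data at all.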
All this work has been defined in terms of minor variants of the Turing machine. In the field of Algorithms, where analysis needs to be more precise, other models have been developed, such as the Random Access Machine (RAM), that are closer to physical computers while remaining relatively abstract and hence widely applicable. However, even these machines were heavily influenced by the Turing machine, and most of the complexity classes defined for the Turing machine carry over to the RAM. Whereas the machine models became the foundation of complexity and algorithmic theory, it was Church’s λ-calculus that became the foundation for the theory of programming languages [3]. This came about largely under the influence of Dana Scott and Christopher Strachey who, at the suggestion of Roger Penrose, developed denotational semantics, a topologically influenced account of higher-order computations acting on infinite data objects such as functions and streams, that are inherent in Church’s formalism. Scott and Strachey’s work meshed with Church’s work on classical type theory as a foundation for mathematics, with L.E.J. Brouwer’s program of constructive foundations that led to Per Martin-Löf’s development of Intuitionistic Type Theory, and with N.G. de Bruijn’s AUTOMATH language for expressing machine-checked proof, all of which were similarly founded on the λ-calculus. The result is an integrated theory of computation and deduction, known as the Propositions as Types principle, that consolidates programs with proofs, and types with propositions. This principle lies at the heart of most programming language theory and many program verification and proof checking systems. Robin Milner’s work on the LCF prover in the 1970s led to the emergence of functional programming, based directly on the λ-calculus, and of interactive proof development, based on Scott’s logic of computable functions arising from his program of denotational semantics.

Similar articles

The Imperative and Functional Programming Paradigm

In Turing (1937) a characterization is given of those functions that can be computed using a mechanical device. Moreover it was shown that some precisely stated problems cannot be decided by such functions. In order to give evidence for the power of this model of computation, Turing showed in the same paper that machine computability has the same strength as definability via λ-calculus, introdu...

An Expanded Ambient Calculus with Π-calculus Embedding

Formal methods is the branch of computer science most concerned with the mathematical formalization of computational frameworks and the specification of programs and programming languages. The formulation of computability most familiar to an undergraduate is most likely the Turing machine, which models computation in a way intuitive to the layperson; a Turing machine consists of a malleable tap...

Encoding Turing Machines into the Deterministic Lambda-Calculus

1. Weakly strategy independent : the image of the encoding is a very small fragment of the λ-calculus, that we call the deterministic λ-calculus Λdet. Essentially, it is the CPS (continuation-passing style) λ-calculus restricted to weak evaluation (i.e., not under abstractions). In Λdet every term has at most one redex, and so all weak strategies collapse into a single deterministic evaluation ...
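A minimal sketch of the continuation-passing idea the snippet describes (this is an illustration of CPS in general, not the paper's actual Λdet encoding): in CPS every call takes an explicit continuation `k`, so at each step there is exactly one thing to do next, which is the sense in which "every term has at most one redex". The function name `fact_cps` is hypothetical:

```python
# Factorial in continuation-passing style: instead of returning a value,
# each call passes its result to an explicit continuation k, so the order
# of evaluation is completely determined by the program text.
def fact_cps(n, k):
    if n == 0:
        return k(1)
    # The continuation records the single pending step: multiply by n.
    return fact_cps(n - 1, lambda r: k(n * r))

print(fact_cps(5, lambda r: r))  # 120
```

Because every intermediate result threads through a continuation, no subterm is ever evaluated "under" another pending computation, mirroring the restriction to weak evaluation mentioned above.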

A Remote Programming Technology on a Remote VDM Clustering in λ-Calculus

Knowledge-based object programming tools were studied [1-7]. Formal technology dependent grammar differentiation and integration tools were developed and used to construct a contemporary science frame on an arbitrary knowledge-based object [8-9]. Tasım clustering tools were studied [1-14]. λ-Calculus, Turing machine and abstract computation are well known as in [15,16,17] etc. This paper: (a)...

An Intensional Concurrent Faithful Encoding of Turing Machines

The benchmark for computation is typically given as Turing computability; the ability for a computation to be performed by a Turing Machine. Many languages exploit (indirect) encodings of Turing Machines to demonstrate their ability to support arbitrary computation. However, these encodings are usually by simulating the entire Turing Machine within the language, or by encoding a language that d...

Publication date: 2015